Minimax Estimation of Maximum Mean Discrepancy with Radial Kernels
Authors
Abstract
Maximum Mean Discrepancy (MMD) is a distance on the space of probability measures which has found numerous applications in machine learning and nonparametric testing. This distance is based on the notion of embedding probabilities in a reproducing kernel Hilbert space. In this paper, we present the first known lower bounds for the estimation of MMD based on finite samples. Our lower bounds hold for any radial universal kernel on R^d and match the existing upper bounds up to constants that depend only on the properties of the kernel. Using these lower bounds, we establish the minimax rate optimality of the empirical estimator and its U-statistic variant, which are usually employed in applications.
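The two estimators mentioned above, the plug-in empirical estimator and its unbiased U-statistic variant, can be sketched as follows for a radial (Gaussian) kernel. This is a minimal illustration, not the paper's construction; the bandwidth `sigma` and the function names are illustrative choices:

```python
import numpy as np

def gaussian_gram(A, B, sigma=1.0):
    # Radial kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2)),
    # evaluated on all pairs of rows of A and B.
    d2 = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def mmd2_estimators(X, Y, sigma=1.0):
    """Plug-in (V-statistic) and unbiased U-statistic estimates of MMD^2."""
    m, n = len(X), len(Y)
    Kxx = gaussian_gram(X, X, sigma)
    Kyy = gaussian_gram(Y, Y, sigma)
    Kxy = gaussian_gram(X, Y, sigma)
    # Empirical (biased) estimator: averages include the diagonal terms.
    v_stat = Kxx.mean() + Kyy.mean() - 2.0 * Kxy.mean()
    # U-statistic: diagonal terms k(x_i, x_i) are excluded, giving an
    # unbiased estimate of MMD^2.
    u_stat = ((Kxx.sum() - np.trace(Kxx)) / (m * (m - 1))
              + (Kyy.sum() - np.trace(Kyy)) / (n * (n - 1))
              - 2.0 * Kxy.mean())
    return v_stat, u_stat
```

On samples from two well-separated Gaussians the estimates are large, while on two samples from the same distribution the U-statistic fluctuates around zero, consistent with its unbiasedness.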
Similar resources
Minimax Kernels for Nonparametric Estimation
SUMMARY The minimax kernels for estimating a nonparametric function and its derivatives are investigated. Our motivation comes from a study of the minimax properties of nonparametric kernel estimates of probability densities and their derivatives. The asymptotic expression of the linear maximum risk is established. The corresponding minimax risk depends on the solutions to a kernel variational problem,...
Truncated Linear Minimax Estimator of a Power of the Scale Parameter in a Lower-Bounded Parameter Space
Minimax estimation problems with restricted parameter space have attracted increasing interest within the last two decades. Some authors derived minimax and admissible estimators of bounded parameters under squared error loss and scale-invariant squared error loss. In some truncated estimation problems, the most natural estimator to consider is the truncated version of a classic...
Minimax Estimation of the Scale Parameter in a Family of Transformed Chi-Square Distributions under Asymmetric Squared Log Error and MLINEX Loss Functions
This paper is concerned with the problem of finding the minimax estimators of the scale parameter in a family of transformed chi-square distributions, under asymmetric squared log error (SLE) and modified linear exponential (MLINEX) loss functions, using the Lehmann Theorem [2]. We also show that the results of Podder et al. [4] for the Pareto distribution are a special case of our results for th...
Estimating a Bounded Normal Mean Under the LINEX Loss Function
Let X be a random variable from a normal distribution with unknown mean θ and known variance σ^2. In many practical situations, θ is known in advance to lie in an interval, say [−m,m], for some m > 0. As the usual estimator of θ, i.e., X under the LINEX loss function is inadmissible, finding some competitors for X becomes worthwhile. The only study in the literature considered the problem of min...
Minimax Estimation of Quadratic Fourier Functionals
We study estimation of (semi-)inner products between two nonparametric probability distributions, given IID samples from each distribution. These products include relatively well-studied classical L^2 and Sobolev inner products, as well as those induced by translation-invariant reproducing kernels, for which we believe our results are the first. We first propose estimators for these quantities, a...